Week 8B: Raster Data in the Wild

Oct 22, 2020

This week: raster data analysis with the holoviz ecosystem

Two case studies:

  • Last time: Using satellite imagery to detect changes in lake volume
  • Today: Detecting urban heat islands in Philadelphia

Initialize our packages:

In [4]:
from matplotlib import pyplot as plt
import numpy as np
import pandas as pd
import geopandas as gpd
%matplotlib inline
In [6]:
import intake
import xarray as xr
import cartopy.crs as ccrs
In [7]:
import hvplot.xarray
import hvplot.pandas
import holoviews as hv
import geoviews as gv
from geoviews.tile_sources import EsriImagery
hv.extension('bokeh')

Create the intake data catalog so we can load our datasets:

In [8]:
url = 'https://raw.githubusercontent.com/pyviz-topics/EarthML/master/examples/catalog.yml'
cat = intake.open_catalog(url)

Example #2: Urban heat islands

First load some metadata for Landsat 8

In [9]:
band_info = pd.DataFrame([
        (1,  "Aerosol", " 0.43 - 0.45",    0.440,  "30",         "Coastal aerosol"),
        (2,  "Blue",    " 0.45 - 0.51",    0.480,  "30",         "Blue"),
        (3,  "Green",   " 0.53 - 0.59",    0.560,  "30",         "Green"),
        (4,  "Red",     " 0.64 - 0.67",    0.655,  "30",         "Red"),
        (5,  "NIR",     " 0.85 - 0.88",    0.865,  "30",         "Near Infrared (NIR)"),
        (6,  "SWIR1",   " 1.57 - 1.65",    1.610,  "30",         "Shortwave Infrared (SWIR) 1"),
        (7,  "SWIR2",   " 2.11 - 2.29",    2.200,  "30",         "Shortwave Infrared (SWIR) 2"),
        (8,  "Panc",    " 0.50 - 0.68",    0.590,  "15",         "Panchromatic"),
        (9,  "Cirrus",  " 1.36 - 1.38",    1.370,  "30",         "Cirrus"),
        (10, "TIRS1",   "10.60 - 11.19",   10.895, "100 * (30)", "Thermal Infrared (TIRS) 1"),
        (11, "TIRS2",   "11.50 - 12.51",   12.005, "100 * (30)", "Thermal Infrared (TIRS) 2")],
    columns=['Band', 'Name', 'Wavelength Range (µm)', 'Nominal Wavelength (µm)', 'Resolution (m)', 'Description']).set_index(["Band"])
band_info
Out[9]:
Name Wavelength Range (µm) Nominal Wavelength (µm) Resolution (m) Description
Band
1 Aerosol 0.43 - 0.45 0.440 30 Coastal aerosol
2 Blue 0.45 - 0.51 0.480 30 Blue
3 Green 0.53 - 0.59 0.560 30 Green
4 Red 0.64 - 0.67 0.655 30 Red
5 NIR 0.85 - 0.88 0.865 30 Near Infrared (NIR)
6 SWIR1 1.57 - 1.65 1.610 30 Shortwave Infrared (SWIR) 1
7 SWIR2 2.11 - 2.29 2.200 30 Shortwave Infrared (SWIR) 2
8 Panc 0.50 - 0.68 0.590 15 Panchromatic
9 Cirrus 1.36 - 1.38 1.370 30 Cirrus
10 TIRS1 10.60 - 11.19 10.895 100 * (30) Thermal Infrared (TIRS) 1
11 TIRS2 11.50 - 12.51 12.005 100 * (30) Thermal Infrared (TIRS) 2

Landsat data on Google Cloud Storage

We'll be downloading publicly available Landsat data from Google Cloud Storage

See: https://cloud.google.com/storage/docs/public-datasets/landsat

The relevant information is stored in our intake catalog:

From our catalog.yml file:

google_landsat_band:
    description: Landsat bands from Google Cloud Storage
    driver: rasterio
    parameters:
      path:
        description: landsat path
        type: int
      row:
        description: landsat row
        type: int
      product_id:
        description: landsat file id
        type: str
      band:
        description: band
        type: int
    args:
      urlpath: https://storage.googleapis.com/gcp-public-data-landsat/LC08/01/{{ '%03d' % path }}/{{ '%03d' % row }}/{{ product_id }}/{{ product_id }}_B{{ band }}.TIF
      chunks:
        band: 1
        x: 256
        y: 256

From the "urlpath" above, you can see we need "path", "row", and "product_id" variables to properly identify a Landsat image:

The path and row corresponding to the area over Philadelphia has already been selected using the Earth Explorer. This tool was also used to find the id of the particular date of interest using the same tool.

In [10]:
# Necessary variables
path = 14
row = 32
product_id = 'LC08_L1TP_014032_20160727_20170222_01_T1'

Use a utility function to query the API and request a specific band

This will return a specific Landsat band as a dask array.

In [11]:
from random import random
from time import sleep


def get_band_with_exponential_backoff(path, row, product_id, band, maximum_backoff=32):
    """
    Google Cloud Storage recommends using exponential backoff 
    when accessing the API. 
    
    https://cloud.google.com/storage/docs/exponential-backoff
    """
    n = backoff = 0
    while backoff < maximum_backoff:
        try:
            return cat.google_landsat_band(
                path=path, row=row, product_id=product_id, band=band
            ).to_dask()
        except:
            backoff = min(2 ** n + random(), maximum_backoff)
            sleep(backoff)
            n += 1

Load all of the bands and combine them into a single xarray DataArray

Loop over each band, load that band using the above function, and then concatenate the results together..

In [12]:
bands = [1, 2, 3, 4, 5, 6, 7, 9, 10, 11]

datasets = []
for band in bands:
    da = get_band_with_exponential_backoff(
        path=path, row=row, product_id=product_id, band=band
    )
    da = da.assign_coords(band=[band])
    datasets.append(da)

ds = xr.concat(datasets, "band", compat="identical")

ds
Out[12]:
<xarray.DataArray (band: 10, y: 7871, x: 7741)>
dask.array<concatenate, shape=(10, 7871, 7741), dtype=uint16, chunksize=(1, 256, 256), chunktype=numpy.ndarray>
Coordinates:
  * y        (y) float64 4.582e+06 4.582e+06 4.582e+06 ... 4.346e+06 4.346e+06
  * x        (x) float64 3.954e+05 3.954e+05 3.955e+05 ... 6.276e+05 6.276e+05
  * band     (band) int64 1 2 3 4 5 6 7 9 10 11
Attributes:
    transform:      (30.0, 0.0, 395385.0, 0.0, -30.0, 4582215.0)
    crs:            +init=epsg:32618
    res:            (30.0, 30.0)
    is_tiled:       1
    nodatavals:     (nan,)
    scales:         (1.0,)
    offsets:        (0.0,)
    AREA_OR_POINT:  Point

Important: CRS info

CRS is EPSG=32618

Also grab the metadata from Google Cloud Storage

  • There is an associated metadata file stored on Google Cloud Storage
  • The below function will parse that metadata file for a specific path, row, and product ID
  • The specifics of this are not overly important for our purposes
In [13]:
def load_google_landsat_metadata(path, row, product_id):
    """
    Load Landsat metadata for path, row, product_id from Google Cloud Storage
    """

    def parse_type(x):
        try:
            return eval(x)
        except:
            return x

    baseurl = "https://storage.googleapis.com/gcp-public-data-landsat/LC08/01"
    suffix = f"{path:03d}/{row:03d}/{product_id}/{product_id}_MTL.txt"

    df = intake.open_csv(
        urlpath=f"{baseurl}/{suffix}",
        csv_kwargs={
            "sep": "=",
            "header": None,
            "names": ["index", "value"],
            "skiprows": 2,
            "converters": {"index": (lambda x: x.strip()), "value": parse_type},
        },
    ).read()
    metadata = df.set_index("index")["value"]
    return metadata
In [14]:
metadata = load_google_landsat_metadata(path, row, product_id)
metadata.head()
Out[14]:
index
ORIGIN                Image courtesy of the U.S. Geological Survey
REQUEST_ID                                     0501702206266_00020
LANDSAT_SCENE_ID                             LC80140322016209LGN01
LANDSAT_PRODUCT_ID        LC08_L1TP_014032_20160727_20170222_01_T1
COLLECTION_NUMBER                                               01
Name: value, dtype: object

Let's trim our data to the Philadelphia limits

  • The Landsat image covers an area much wider than the Philadelphia limits
  • We'll load the city limits from Open Data Philly
  • Use the slice() function discussed last lecture

Load the city limits

In [17]:
# Load the GeoJSON from the URL
url = "http://data.phl.opendata.arcgis.com/datasets/405ec3da942d4e20869d4e1449a2be48_0.geojson"
city_limits = gpd.read_file(url)

# Convert to the right CRS for this data
city_limits = city_limits.to_crs(epsg=32618)

Use the xmin/xmax and ymin/ymax of the city limits to trim the Landsat DataArray

  • Use the built-in slice functionality of xarray
  • Remember, the y coordinates are in descending order, so you'll slice will need to be in descending order as well
In [24]:
# Look up the order of xmin/xmax and ymin/ymax
city_limits.total_bounds?
In [18]:
# Get x/y range of city limits from "total_bounds"
xmin, ymin, xmax, ymax = city_limits.total_bounds
In [19]:
# Slice our xarray data
subset = ds.sel(
                x=slice(xmin, xmax), 
                y=slice(ymax, ymin)
               ) # NOTE: y coordinate system is in descending order!

Call the compute() function on your sliced data

  • Originally, the Landsat data was stored as dask arrays
  • This should now only load the data into memory that covers Philadelphia
In [20]:
subset = subset.compute()

subset
Out[20]:
<xarray.DataArray (band: 10, y: 1000, x: 924)>
array([[[10702, 10162, 10361, ..., 11681, 11489, 15594],
        [10870, 10376, 10122, ..., 11620, 11477, 12042],
        [10429, 10147, 10063, ..., 11455, 11876, 11790],
        ...,
        [11944, 11696, 11626, ..., 11192, 11404, 10923],
        [12406, 11555, 11155, ..., 11516, 11959, 10766],
        [11701, 11232, 10657, ..., 10515, 11475, 10926]],

       [[ 9809,  9147,  9390, ..., 10848, 10702, 15408],
        [ 9989,  9405,  9139, ..., 10831, 10660, 11361],
        [ 9439,  9083,  8981, ..., 10654, 11141, 11073],
        ...,
        [11220, 10853, 10741, ..., 10318, 10511,  9950],
        [11765, 10743, 10259, ..., 10646, 11378,  9829],
        [10889, 10365,  9630, ...,  9500, 10573, 10008]],

       [[ 9042,  8466,  8889, ..., 10014,  9647, 14981],
        [ 9699,  8714,  8596, ...,  9866,  9783, 11186],
        [ 8623,  8457,  8334, ...,  9688, 10474,  9993],
        ...,
...
        ...,
        [ 5027,  5028,  5038, ...,  5035,  5037,  5029],
        [ 5058,  5021,  5023, ...,  5035,  5041,  5031],
        [ 5036,  5041,  5040, ...,  5036,  5044,  5035]],

       [[29033, 28976, 28913, ..., 32614, 32577, 32501],
        [28940, 28904, 28858, ..., 32516, 32545, 32545],
        [28882, 28879, 28854, ..., 32431, 32527, 32563],
        ...,
        [30094, 29929, 29713, ..., 29521, 29525, 29429],
        [30140, 29972, 29696, ..., 29556, 29516, 29398],
        [30133, 29960, 29614, ..., 29587, 29533, 29424]],

       [[25558, 25519, 25492, ..., 27680, 27650, 27619],
        [25503, 25450, 25402, ..., 27636, 27630, 27639],
        [25473, 25434, 25378, ..., 27609, 27668, 27667],
        ...,
        [26126, 26055, 25934, ..., 25622, 25586, 25520],
        [26149, 26077, 25935, ..., 25651, 25594, 25507],
        [26147, 26050, 25856, ..., 25696, 25644, 25571]]], dtype=uint16)
Coordinates:
  * y        (y) float64 4.443e+06 4.443e+06 4.443e+06 ... 4.413e+06 4.413e+06
  * x        (x) float64 4.761e+05 4.761e+05 4.761e+05 ... 5.037e+05 5.038e+05
  * band     (band) int64 1 2 3 4 5 6 7 9 10 11
Attributes:
    transform:      (30.0, 0.0, 395385.0, 0.0, -30.0, 4582215.0)
    crs:            +init=epsg:32618
    res:            (30.0, 30.0)
    is_tiled:       1
    nodatavals:     (nan,)
    scales:         (1.0,)
    offsets:        (0.0,)
    AREA_OR_POINT:  Point
In [21]:
# Original data was about 8000 x 8000 in size
ds.shape
Out[21]:
(10, 7871, 7741)
In [22]:
# Sliced data is only about 1000 x 1000 in size!
subset.shape
Out[22]:
(10, 1000, 924)

Let's plot band 3 of the Landsat data

In [23]:
# City limits plot
limits_plot = city_limits.hvplot.polygons(
    line_color="white", alpha=0, line_alpha=1, crs=32618, geo=True
)

# Landsat plot
landsat_plot = subset.sel(band=3).hvplot.image(
    x="x",
    y="y",
    rasterize=True,
    cmap="viridis",
    frame_width=500,
    frame_height=500,
    geo=True,
    crs=32618,
)

# Combined
landsat_plot * limits_plot
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
Out[23]:

Calculate the NDVI

Remember, NDVI = (NIR - Red) / (NIR + Red)

In [24]:
band_info.head()
Out[24]:
Name Wavelength Range (µm) Nominal Wavelength (µm) Resolution (m) Description
Band
1 Aerosol 0.43 - 0.45 0.440 30 Coastal aerosol
2 Blue 0.45 - 0.51 0.480 30 Blue
3 Green 0.53 - 0.59 0.560 30 Green
4 Red 0.64 - 0.67 0.655 30 Red
5 NIR 0.85 - 0.88 0.865 30 Near Infrared (NIR)

NIR is band 5 and Red is band 4

In [25]:
# Selet the bands we need
NIR = subset.sel(band=5)
RED = subset.sel(band=4)

# Calculate the NDVI
NDVI = (NIR - RED) / (NIR + RED)
NDVI = NDVI.where(NDVI < np.inf)

NDVI.head()
Out[25]:
<xarray.DataArray (y: 5, x: 5)>
array([[0.34690712, 0.43934922, 0.45820226, 0.45798014, 0.38205128],
       [0.33747045, 0.40266976, 0.47276229, 0.44722264, 0.4530777 ],
       [0.40783302, 0.44093268, 0.47419128, 0.49448025, 0.49341935],
       [0.40190311, 0.39986498, 0.46889397, 0.48497283, 0.48686142],
       [0.47248631, 0.47255625, 0.46365524, 0.37692424, 0.36055777]])
Coordinates:
  * y        (y) float64 4.443e+06 4.443e+06 4.443e+06 4.443e+06 4.443e+06
  * x        (x) float64 4.761e+05 4.761e+05 4.761e+05 4.762e+05 4.762e+05
In [26]:
# NEW: Use datashader to plot the NDVI
p = NDVI.hvplot.image(
    x="x",
    y="y",
    geo=True,
    crs=32618,
    datashade=True, # NEW: USE DATASHADER
    project=True, # NEW: PROJECT THE DATA BEFORE PLOTTING
    frame_height=500,
    frame_width=500,
    cmap="viridis",
)

# NEW: Use a transparent rasterized version of the plot for hover text
# No hover text available with datashaded images so we can use the rasterized version
p_hover = NDVI.hvplot(
    x="x",
    y="y",
    crs=32618,
    geo=True,
    rasterize=True,
    frame_height=500,
    frame_width=500,
    alpha=0, # SET ALPHA=0 TO MAKE THIS TRANSPARENT
    colorbar=False,
)

# City limits
limits_plot = city_limits.hvplot.polygons(
    geo=True, crs=32618, line_color="white", line_width=2, alpha=0, line_alpha=1
)

p * p_hover * limits_plot
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
Out[26]:

A couple of notes

  • Notice that we leveraged datashader algorithm to plot the NDVI by specifying the datashade=True keyword
  • Hover tools won't work with datashaded images — instead, we overlaid a transparent rasterized version of the image, which shows the mean pixel values across the image

Now, on to land surface temperature

  • Given the NDVI calculated above, we can determine land surface temperature
  • But, the data must be cleaned first! Several atmospheric corrections and transformations must be applied.
  • We'll use existing an existing package called rio_toa to do this
  • We'll also need to specify one more for transforming satellite temperature (brightness temp) to land surface temperature.
  • For these calculations we'll use both Thermal Infrared bands - 10 and 11.
In [27]:
from rio_toa import brightness_temp, toa_utils
In [28]:
def land_surface_temp(BT, w, NDVI):
    """Calculate land surface temperature of Landsat 8
    
    temp = BT/1 + w * (BT /p) * ln(e)
    
    BT = At Satellite temperature (brightness)
    w = wavelength of emitted radiance (μm)
    
    where p = h * c / s (1.439e-2 mK)
    
    h = Planck's constant (Js)
    s = Boltzmann constant (J/K)
    c = velocity of light (m/s)
    """
    h = 6.626e-34
    s = 1.38e-23
    c = 2.998e8

    p = (h * c / s) * 1e6

    Pv = (NDVI - NDVI.min() / NDVI.max() - NDVI.min()) ** 2
    e = 0.004 * Pv + 0.986

    return BT / 1 + w * (BT / p) * np.log(e)
In [30]:
def land_surface_temp_for_band(band):
    """
    Utility function to get land surface temperature for a specific band
    """
    # params from general Landsat info
    w = band_info.loc[band]["Nominal Wavelength (µm)"]

    # params from specific Landsat data text file for these images
    ML = metadata[f"RADIANCE_MULT_BAND_{band}"]
    AL = metadata[f"RADIANCE_ADD_BAND_{band}"]
    K1 = metadata[f"K1_CONSTANT_BAND_{band}"]
    K2 = metadata[f"K2_CONSTANT_BAND_{band}"]

    at_satellite_temp = brightness_temp.brightness_temp(
        subset.sel(band=band).values, ML, AL, K1, K2
    )

    return land_surface_temp(at_satellite_temp, w, NDVI)

Get land surface temp for band 10

In [31]:
# temperature for band 10
band = 10
temp_10 = land_surface_temp_for_band(band).compute()

# convert to Farenheit
temp_10_f = toa_utils.temp_rescale(temp_10, 'F')

# Save the numpy array as an xarray
band_10 = subset.sel(band=band).copy(data=temp_10_f)

Get the land surface temp for band 11

In [32]:
# Get the land surface temp
band = 11
temp_11 = land_surface_temp_for_band(band).compute()

# Convert to Farenheit
temp_11_f = toa_utils.temp_rescale(temp_11, "F")

# Save as an xarray DataArray
band_11 = subset.sel(band=band).copy(data=temp_11_f)

Plot both temperatures side-by-side

In [33]:
# plot for band 10
plot_10 = band_10.hvplot.image(
    x="x",
    y="y",
    rasterize=True,
    cmap="fire_r",
    crs=32618,
    geo=True,
    frame_height=400,
    frame_width=400,
    title="Band 10",
)

# plot for band 11
plot_11 = band_11.hvplot.image(
    x="x",
    y="y",
    rasterize=True,
    cmap="fire_r",
    crs=32618,
    geo=True,
    frame_height=400,
    frame_width=400,
    title="Band 11",
)

plot_10 + plot_11
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
Out[33]:

Plot the final product (the average of the bands)

In [34]:
# Let's average the results of these two bands
mean_temp_f = (band_10 + band_11) / 2

# IMPORTANT: copy over the metadata
mean_temp_f.attrs = band_10.attrs
In [35]:
mean_temp_f
Out[35]:
<xarray.DataArray (y: 1000, x: 924)>
array([[79.40579086, 79.18871976, 78.98910218, ..., 91.86524855,
        91.72038357, 91.49660753],
       [79.07306227, 78.86416093, 78.64685031, ..., 91.56712235,
        91.60934815, 91.6311314 ],
       [78.87625297, 78.77157573, 78.57777687, ..., 91.33527213,
        91.66605844, 91.73405961],
       ...,
       [83.01805262, 82.50340008, 81.75768196, ..., 80.579109  ,
        80.49671178, 80.13207064],
       [83.16916028, 82.64632139, 81.7252    , ..., 80.72431108,
        80.4982923 , 80.03521661],
       [83.14992768, 82.55447768, 81.35868618, ..., 80.90148737,
        80.65920451, 80.25022931]])
Coordinates:
  * y        (y) float64 4.443e+06 4.443e+06 4.443e+06 ... 4.413e+06 4.413e+06
  * x        (x) float64 4.761e+05 4.761e+05 4.761e+05 ... 5.037e+05 5.038e+05
Attributes:
    transform:      (30.0, 0.0, 395385.0, 0.0, -30.0, 4582215.0)
    crs:            +init=epsg:32618
    res:            (30.0, 30.0)
    is_tiled:       1
    nodatavals:     (nan,)
    scales:         (1.0,)
    offsets:        (0.0,)
    AREA_OR_POINT:  Point
In [36]:
p = mean_temp_f.hvplot.image(
    x="x",
    y="y",
    rasterize=True,
    crs=32618,
    geo=True,
    frame_width=600,
    frame_height=600,
    cmap="rainbow",
    alpha=0.5,
    legend=False,
    title="Mean Surface Temp (F)"
)

img = p * EsriImagery

img
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/holoviews/plotting/util.py:685: MatplotlibDeprecationWarning: The global colormaps dictionary is no longer considered public API.
  [cmap for cmap in cm.cmap_d if not
Out[36]:

Use the zoom tool to identify hot spots

Key insight: color of the building roof plays a very important role in urban heat islands

Exercise: isolate hot spots on the map

We can use the .where() function in xarray to identify the heat islands across the city

To do:

  • Select only those pixels with mean temperatures about 90 degrees.
  • Remember, the where() function takes a boolean array where each pixel is True/False based on the selection criteria
  • Use hvplot to plot the imagery for Philadelphia, with the hotspots overlaid
In [ ]:
 

What about cold spots?

Let's select things less than 75 degrees as well...

In [ ]:
 

Exercise: Exploring the relationship between temperature and parks/trees

In this section, we'll use zonal statistics to calculate the average temperatures for neighborhoods and parks, and compare this to the number of street trees in the city.

Saving xarray data to a geoTIFF file

  • Unfortunately, the zonal statistics calculation requires a rasterio file — so, we will need to to save our mean temperature xarray DataArray to a .tif file
  • This is not built in to xarray (yet), but we can use the rioxarray package
In [41]:
import rioxarray
In [42]:
# Save to a raster GeoTIFF file!
mean_temp_f.rio.to_raster("./mean_temp_f.tif")
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/pyproj/crs/crs.py:280: FutureWarning: '+init=<authority>:<code>' syntax is deprecated. '<authority>:<code>' is the preferred initialization method. When making the change, be mindful of axis order changes: https://pyproj4.github.io/pyproj/stable/gotchas.html#axis-order-changes-in-proj-6
  projstring = _prepare_from_string(projparams)

Part 1.1: Calculate the mean temperature for parks in Philadelphia

  • The parks dataset is available in the data folder: ./data/parks.geojson
  • Use the rasterstats package to calculate the zonal stats
  • Look back at week 4's lecture for help!
  • Make sure the CRS matches!
In [43]:
from rasterstats import zonal_stats
import rasterio as rio
In [ ]:
 

Part 1.2: Plot a histogram of temperature values for each park

  • Before plotting the histogram, subtract out the mean temperature of the entire city
  • This will quickly let us see which parks have average temperatures above or below the citywide average
In [ ]:
 
In [ ]:
 

Part 2.1: Calculate the average temperature in Philadelphia's neighborhoods

In [ ]:
 

Part 2.2: Plot a choropleth map of the neighborhoods, coloring by temperature

  • Make a static chart with Geopandas
  • Make an interactive chart with Altair / hvplot, so we can quickly identify the neighborhoods with the highest/lowest temperatures!

Plot using geopandas

In [ ]:
 

Plot using altair/hvplot

In [ ]:
 

Part 2.3: Calculate the number of street trees per neighborhood

In [ ]:
 
In [ ]:
 

Part 2.4: Use seaborn to explore the relationship b/w street trees and temperature

  • Use the kdeplot() function to plot the relationship between the number of trees per neighborhood and the mean temperature per neighborhood
  • You will need to merge() the relevant data frames together based on the neighborhood name

What are the major trends? Do you see the relationship you expect?"

In [73]:
import seaborn as sns
In [ ]: